Biplots in Reduced-Rank Regression


Similar Articles

Biplots in Reduced-Rank Regression

Regression problems with a number of related response variables are typically analyzed by separate multiple regressions. This paper shows how these regressions can be visualized jointly in a biplot based on reduced-rank regression. Reduced-rank regression combines multiple regression and principal components analysis and can therefore be carried out with standard statistical packages....
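The combination of multiple regression and principal components described in this abstract can be sketched in a few lines of NumPy: fit an unconstrained least-squares coefficient matrix, then project the fitted values onto their leading principal directions. Function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def reduced_rank_regression(X, Y, rank):
    """Reduced-rank regression as OLS followed by a rank truncation.

    Fits the full-rank OLS coefficients, then projects the fitted
    values onto their first `rank` principal components.
    """
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)   # full-rank OLS fit
    fitted = X @ B_ols
    # principal directions of the fitted values
    _, _, Vt = np.linalg.svd(fitted, full_matrices=False)
    V = Vt[:rank].T                                  # q x rank loadings
    return B_ols @ V @ V.T                           # rank-constrained coefficients

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
B_true = rng.standard_normal((5, 1)) @ rng.standard_normal((1, 4))  # rank-1 truth
Y = X @ B_true + 0.1 * rng.standard_normal((100, 4))
B_rrr = reduced_rank_regression(X, Y, rank=1)
print(np.linalg.matrix_rank(B_rrr))  # 1
```

Because each step is an SVD or an ordinary least-squares solve, this is exactly the kind of computation the abstract notes can be done with standard statistical packages.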


Nonparametric Reduced Rank Regression

We propose an approach to multivariate nonparametric regression that generalizes reduced rank regression for linear models. An additive model is estimated for each dimension of a q-dimensional response, with a shared p-dimensional predictor variable. To control the complexity of the model, we employ a functional form of the Ky-Fan or nuclear norm, resulting in a set of function estimates that h...
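The nuclear norm mentioned above is the sum of a matrix's singular values, and its proximal map (singular-value soft-thresholding) is the basic building block of nuclear-norm-penalized estimation. A minimal sketch, not taken from the paper:

```python
import numpy as np

def nuclear_norm(B):
    """Sum of singular values: the convex surrogate for rank."""
    return np.linalg.svd(B, compute_uv=False).sum()

def svt(B, tau):
    """Proximal map of tau * nuclear norm: soft-threshold singular values."""
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

B = np.diag([3.0, 1.0, 0.2])
print(np.linalg.matrix_rank(svt(B, 0.5)))  # 2: the 0.2 singular value is zeroed
```

Shrinking singular values toward zero is how the penalty controls model complexity: directions with little explanatory power are eliminated entirely, reducing the effective rank.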


Partially Linear Reduced-rank Regression

We introduce a new dimension-reduction technique, the Partially Linear Reduced-rank Regression (PLRR) model, for exploring possible nonlinear structure in a regression involving both multivariate response and covariate. The PLRR model specifies that the response vector loads linearly on some linear indices of the covariate, and nonlinearly on some other indices of the covariate. We give a set o...


Reduced rank regression in Bayesian FDA

In functional data analysis (FDA) it is of interest to generalize techniques of multivariate analysis like canonical correlation analysis or regression to functions which are often observed with noise. In the proposed Bayesian approach to FDA two tools are combined: (i) a special Demmler-Reinsch like basis of interpolation splines to represent functions parsimoniously and flexibly; (ii) latent v...


CS 229 Final Project: Reduced Rank Regression

where A is an unknown p × n matrix of coefficients and E is an unobserved m × n random noise matrix with independent, mean-zero entries of variance σ². We want to find an estimate Â such that ||Y − XÂ|| is small. If we use standard least-squares estimation directly to estimate A in (1.1) without adding any constraints, it is the same as regressing each response on the predictors separately. In thi...
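The claim that unconstrained least squares decouples into separate per-response regressions is easy to verify numerically; the dimensions below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))   # m x p predictor matrix
Y = rng.standard_normal((50, 4))   # m x n response matrix

# joint least-squares fit of all responses at once
A_joint, *_ = np.linalg.lstsq(X, Y, rcond=None)

# separate regression of each response column on X
A_sep = np.column_stack(
    [np.linalg.lstsq(X, Y[:, j], rcond=None)[0] for j in range(Y.shape[1])]
)

print(np.allclose(A_joint, A_sep))  # True
```

This is why an unconstrained multivariate fit ignores correlations among the responses, and why a rank constraint is needed to borrow strength across them.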



Journal

Journal title: Biometrical Journal

Year: 1994

ISSN: 0323-3847,1521-4036

DOI: 10.1002/bimj.4710360812